Bias-Variance trade-off (theory)

As usual, we are given a dataset \(D=\{(x_1,y_1),\ldots,(x_n,y_n)\}\), drawn i.i.d. from some distribution \(P(X,Y)\). Throughout this lecture we assume a regression setting, i.e. \(y \in \mathbb{R}\). In this lecture we will decompose the generalization error of a classifier into three rather interpretable terms. Before we do that, let us consider that for any given input \(\mathbb{x}\) there might not exist a unique label \(y\). For example, if your vector \(\mathbb{x}\) describes the features of a house (e.g. #bedrooms, square footage, …) and the label \(y\) its price, you could imagine two houses with identical descriptions selling for different prices. So for any given feature vector \(\mathbb{x}\), there is a distribution over possible labels. We therefore define the following, which will come in useful later on:

Expected Label (given \(\mathbb{x} \in \mathbb{R}^d\)):

\[\bar{y}(\mathbb{x}) = \mathbb{E}_{y|\mathbb{x}}[Y] = \int_{y} y\,\text{Pr}(y|\mathbb{x}) \partial y.\]

The expected label denotes the label you would expect to obtain, given a feature vector \(\mathbb{x}\).
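For intuition, the expected label can be approximated by averaging samples drawn from \(\text{Pr}(y|\mathbb{x})\). A minimal sketch, assuming a made-up conditional distribution for one fixed house \(\mathbb{x}\) (prices in $1000s; none of these numbers come from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical: for a fixed feature vector x, assume y | x ~ N(300, 25^2)
mu_x, sigma = 300.0, 25.0

# Monte Carlo estimate of the expected label ybar(x) = E[y | x]
samples = rng.normal(loc=mu_x, scale=sigma, size=100_000)
ybar_estimate = samples.mean()
print(ybar_estimate)  # close to 300.0
```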

Alright, so we draw our training set \(D\), consisting of \(n\) inputs, i.i.d. from the distribution \(P\). As a second step we typically call some machine learning algorithm \(\mathcal{A}\) on this data set to learn a hypothesis (aka classifier). Formally, we denote this process as \(h_D=\mathcal{A}(D)\).

For a given \(h_D\), learned on data set \(D\) with algorithm \(\mathcal{A}\), we can compute the generalization error (as measured in squared loss) as follows:

Expected Test Error (given \(h_D\)):

\[\mathbb{E}_{(\mathbb{x},y)∼P} \Big[{(h_D(\mathbb{x})−y)}^{2}\Big]=\int_{x}\int_{y}{(h_D(\mathbb{x})−y)}^{2} \,\text{Pr}(\mathbb{x},y)\partial y \partial \mathbb{x}.\]

Note that one can use other loss functions. We use squared loss because it has nice mathematical properties, and it is also the most common loss function.

The previous statement is true for a given training set \(D\). However, remember that \(D\) itself is drawn from \(P^n\), and is therefore a random variable. Further, \(h_D\) is a function of \(D\), and is therefore also a random variable. And we can of course compute its expectation:

Expected Classifier (given \(\mathcal{A}\)):

\[\bar{h}=\mathbb{E}_{D∼P^n}[h_D]=\int_{D}h_D\, \text{Pr}(D)\partial D\]

where \(\text{Pr}(D)\) is the probability of drawing \(D\) from \(P^n\). Here, \(\bar{h}\) is a weighted average over functions.
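For intuition, \(\bar{h}\) can be approximated by training on many independently drawn datasets and averaging the resulting predictions on a fixed grid of inputs. A minimal sketch on a made-up linear toy distribution (not part of the lecture):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def draw_dataset(n=30):
    # Toy P (an illustrative assumption): x ~ U(0, 1), y = 2x + noise
    x = rng.uniform(0, 1, size=n)
    y = 2 * x + rng.normal(0, 0.3, size=n)
    return x.reshape(-1, 1), y

# Approximate hbar = E_D[h_D] by averaging the predictions of classifiers
# trained on many independently drawn training sets
x_grid = np.linspace(0, 1, 50).reshape(-1, 1)
preds = [LinearRegression().fit(*draw_dataset()).predict(x_grid)
         for _ in range(200)]
h_bar = np.mean(preds, axis=0)  # hbar evaluated on x_grid; close to 2x here
```

With enough datasets the averaged predictor concentrates around the best linear fit for this toy \(P\), here \(2x\).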

We can also use the fact that \(h_D\) is a random variable to compute the expected test error only given \(\mathcal{A}\), taking the expectation also over \(D\).

Expected Test Error (given \(\mathcal{A}\)):

\[\mathbb{E}_{{(\mathbb{x},y)∼P}\atop{D∼P^n}} \, \Big[{(h_D(\mathbb{x})−y)}^2\Big] = \int_{D}\int_{\mathbb{x}}\int_{y} {(h_D(\mathbb{x})−y)}^2 \, \text{Pr}(\mathbb{x},y) \, \text{Pr}(D) \partial y \partial \mathbb{x} \partial D\]

To be clear, \(D\) is our training points and the \((\mathbb{x},y)\) pairs are the test points.

We are interested in exactly this expression, because it evaluates the quality of a machine learning algorithm \(\mathcal{A}\) with respect to a data distribution \(P(X,Y)\). In the following we will show that this expression decomposes into three meaningful terms.

Decomposition of Expected Test Error

\[\begin{align*} \mathbb{E}_{\mathbb{x},y,D}\Big[{\big[h_D(\mathbb{x})−y\big]}^2 \Big] &= \mathbb{E}_{\mathbb{x},y,D}\Big[{\big[\big(h_D(\mathbb{x})−\bar{h}(\mathbb{x})\big)+\big(\bar{h}(\mathbb{x})−y\big)\big]}^2 \Big] \\ &= \mathbb{E}_{\mathbb{x},D}\Big[{\big(h_D(\mathbb{x})−\bar{h}(\mathbb{x})\big)}^2\Big] + 2\, \mathbb{E}_{\mathbb{x},y,D}\Big[\big(h_D(\mathbb{x})−\bar{h}(\mathbb{x})\big)\big(\bar{h}(\mathbb{x})−y\big)\Big] + \mathbb{E}_{\mathbb{x},y}\Big[{\big(\bar{h}(\mathbb{x})−y\big)}^2\Big] \end{align*}\]

The middle term of the above equation is 0, as we show below:

\[\begin{align*} \mathbb{E}_{\mathbb{x},y,D} \Big[(h_D(\mathbb{x})−\bar{h}(\mathbb{x}))(\bar{h}(\mathbb{x})−y)\Big] &= \mathbb{E}_{\mathbb{x},y} \Big[\mathbb{E}_D[h_D(\mathbb{x})−\bar{h}(\mathbb{x})](\bar{h}(\mathbb{x})−y) \Big] \\ &=\mathbb{E}_{\mathbb{x},y} \Big[(\mathbb{E}_D[h_D(\mathbb{x})]−\bar{h}(\mathbb{x}))(\bar{h}(\mathbb{x})−y) \Big] \\ &=\mathbb{E}_{\mathbb{x},y} \Big[(\bar{h}(\mathbb{x})−\bar{h}(\mathbb{x}))(\bar{h}(\mathbb{x})−y) \Big] \\ &=\mathbb{E}_{\mathbb{x},y}[0] \\ &=0 \end{align*}\]

Returning to the earlier expression, we are left with the variance and another term:

\[\begin{align*} \mathbb{E}_{\mathbb{x},y,D} \Big[{(h_D(\mathbb{x})−y)}^2 \Big] &= \underbrace{\mathbb{E}_{\mathbb{x},D} \Big[ {(h_D(\mathbb{x})−\bar{h}(\mathbb{x}))}^2 \Big]}_{\text{Variance}} + \mathbb{E}_{\mathbb{x},y} \Big[ {(\bar{h}(\mathbb{x})−y)}^2 \Big] \end{align*}\]

We can break down the second term in the above equation as follows:

\[\begin{align*} \mathbb{E}_{\mathbb{x},y} \big[{(\bar{h}(\mathbb{x})−y)}^2\big] &= \mathbb{E}_{\mathbb{x},y} \Big[{\big[\big(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x})\big)+\big(\bar{y}(\mathbb{x})−y\big)\big]}^2\Big] \\ &= \underbrace{\mathbb{E}_{\mathbb{x},y} \big[{(\bar{y}(\mathbb{x})−y)}^2\big]}_{\text{Noise}} + \underbrace{\mathbb{E}_{\mathbb{x}} \big[{(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x}))}^2\big]}_{\text{Bias}^2} + 2\, \mathbb{E}_{\mathbb{x},y} \big[(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x}))(\bar{y}(\mathbb{x})−y)\big] \end{align*}\]

The third term in the equation above is \(0\), as we show below:

\[\begin{align*} \mathbb{E}_{\mathbb{x},y}\big[(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x}))(\bar{y}(\mathbb{x})−y)\big] &= \mathbb{E}_{\mathbb{x}} \big[\mathbb{E}_{y∣\mathbb{x}}[\bar{y}(\mathbb{x})−y]\,(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x}))\big] \\ &= \mathbb{E}_{\mathbb{x}} \big[(\bar{y}(\mathbb{x})−\mathbb{E}_{y∣\mathbb{x}}[y])(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x}))\big] \\ &= \mathbb{E}_{\mathbb{x}} \big[(\bar{y}(\mathbb{x})−\bar{y}(\mathbb{x}))(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x}))\big] \\ &= \mathbb{E}_{\mathbb{x}}[0] \\ &= 0 \end{align*}\]

This gives us the decomposition of expected test error as follows

\[\begin{align*} \underbrace{\mathbb{E}_{\mathbb{x},y,D} \Big[{(h_D(\mathbb{x})−y)}^2 \Big]}_{\text{Expected Test Error}} = \underbrace{\mathbb{E}_{\mathbb{x},D} \Big[{(h_D(\mathbb{x})−\bar{h}(\mathbb{x}))}^2\Big]}_{\text{Variance}} + \underbrace{\mathbb{E}_{\mathbb{x}} \Big[{(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x}))}^2\Big]}_{\text{Bias}^2} + \underbrace{\mathbb{E}_{\mathbb{x},y}\Big[{(\bar{y}(\mathbb{x})−y)}^2\Big]}_{\text{Noise}} \end{align*}\]

Variance: Captures how much your classifier changes if you train on a different training set. How “over-specialized” is your classifier to a particular training set (overfitting)? If we have the best possible model for our training data, how far off are we from the average classifier?

Bias: What is the inherent error that you obtain from your classifier even with infinite training data? This is due to your classifier being “biased” to a particular kind of solution (e.g. linear classifier). In other words, bias is inherent to your model.

Noise: How big is the data-intrinsic noise? This error measures ambiguity due to your data distribution and feature representation. You can never beat this, it is an aspect of the data.
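As a sanity check, the decomposition can be verified numerically on a toy problem where every term is known in closed form. This is a hedged sketch; the constant predictor and the distribution below are illustrative assumptions, not the lecture's example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: x ~ U(0, 1), y | x ~ N(x, 0.25), and h_D is the constant
# predictor mean(y_train). Then ybar(x) = x, hbar ≈ 0.5, Noise = 0.25,
# Bias^2 = E[(0.5 - x)^2] = 1/12, Variance = Var(mean(y)) = (1/12 + 1/4) / n.
n, n_datasets, n_test = 50, 2000, 2000

# One hypothesis h_D per training set D: the constant mean(y_train)
h_D = np.array([(rng.uniform(0, 1, n) + rng.normal(0, 0.5, n)).mean()
                for _ in range(n_datasets)])
x_test = rng.uniform(0, 1, n_test)
y_test = x_test + rng.normal(0, 0.5, n_test)

h_bar = h_D.mean()                                   # expected classifier
variance = np.mean((h_D - h_bar) ** 2)
bias2 = np.mean((h_bar - x_test) ** 2)               # ybar(x) = x here
noise = np.mean((x_test - y_test) ** 2)
test_error = np.mean((h_D[:, None] - y_test[None, :]) ** 2)
print(test_error, variance + bias2 + noise)          # approximately equal
```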

Bias-Variance trade-off (my notes)

In this colab notebook we are going to explore the previously presented theoretical concepts with a “toy example”. Why a “toy example” and not real data?

In practice, we will never know the true process that generates the data (if we knew it, we would not need machine learning techniques). The value of simulating that process is that it lets us know its distribution exactly, which will help us understand how good our estimates are and improve our intuition about the different concepts and methods.

The (synthetic) training datasets

Instead of playing with real data, we are going to create our own data, since knowing exactly how it was created will help us experiment with the concepts we want to learn.

One way to obtain an infinite source of data is to build a “data-generating machine”. We can specify that “machine” by describing the process through which it will generate data.

One possibility is to start with a function \(f\), where

\[f(x) = (0.45x-2)^3 - 0.55(0.2x-3)^2- 2.5(0.5x - 3)+ 5\]

and add noise \(\epsilon\) so that we obtain the random variable

\[ Y = f(X) + \epsilon.\]

For our exercise, let us take \(\epsilon \sim N(\mu=0,\sigma^2 = 0.5)\).

To make it even more interesting, we will let the \(x\)'s in our dataset have a distribution too, for example, \(X \sim U(a=0,b=10) \).

Note: I chose an \(f(x)\) with a “fairly curvy” shape so that our experiments are more interesting, and we restrict ourselves to \(x \in [0,10]\) (since \(\mathbb{x}\) is a scalar, we write \(x\) instead of \(\mathbb{x}\)) to make visualization easier.

With all this, we end up with the following data-generating process:

\( Y = f(X) + \epsilon\), with \(X \sim U(a=0,b=10) \) and \(\epsilon \sim N(\mu=0,\sigma^2 = 0.5)\).

The same process can also be written as follows:

\[ Y|X \sim \mathcal{N}(y|\mu =f(x),{\sigma}^2 = 0.5)\]
\[ X \sim \mathcal{U}(x|a=0,b=10)\]

This is known as a hierarchical model ([CB01], p. 162) and is, for me, a very simple and intuitive way to describe a data-generating process.

To “use the machine” described this way and generate a sample \((x_0,y_0)\), we start by sampling \(x_0\) from the uniform distribution. Then we “pass” \(x_0\) through \(f\) to obtain \(f(x_0)\). Finally, we sample \(y_0\) from a normal with \(\mu=f(x_0)\) and \(\sigma^2 = 0.5\). This process of generating samples is known as ancestral sampling ([Bis07], p. 365).

Note that the two descriptions, although slightly different, are completely equivalent (they generate data with exactly the same distribution) and allow me to generate as many pairs \((x,y)\) as I want.
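The ancestral-sampling procedure can be sketched in a few lines (reusing the \(f\) and the parameters defined above):

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Same true function as above
    return (0.45*x - 2)**3 - 0.55*(0.2*x - 3)**2 - 2.5*(0.5*x - 3) + 5

# Ancestral sampling: first x0 ~ U(0, 10), then y0 | x0 ~ N(f(x0), 0.5)
x0 = rng.uniform(low=0, high=10)
y0 = rng.normal(loc=f(x0), scale=np.sqrt(0.5))  # scale is the std dev
```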

Data distribution \(Pr(X,Y)\)

In this case, we can write \(\text{Pr}(X,Y)\) analytically,

\[\begin{align*} \text{Pr}(X,Y) &= \text{Pr}(Y|X) \, \text{Pr}(X) \\ &= \mathcal{N}(y|\mu =f(x),{\sigma}^2 = 0.5)\, \mathcal{U}(x|a=0,b=10) \\ &= \frac{1}{\sqrt{2\pi\sigma^2}}\text{exp} \Big\{-\frac{1}{2\sigma^2}(y-f(x))^2\Big\} \frac{1}{b-a} \\ &= \frac{1}{10\sqrt{\pi}}\text{exp} \Big\{-(y-f(x))^2\Big\}. \end{align*}\]

Here we used \(\mathcal{N}(y | \mu,\sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}}\text{exp} \Big\{-\frac{1}{2\sigma^2}(y-\mu)^2\Big\}\) and \(\mathcal{U}(x|a,b) = \frac{1}{b-a}\).
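We can sanity-check the closed form above against scipy's pdfs; the evaluation point \((x,y)=(3,5)\) is just an arbitrary choice:

```python
import numpy as np
from scipy.stats import norm, uniform

def f(x):
    return (0.45*x - 2)**3 - 0.55*(0.2*x - 3)**2 - 2.5*(0.5*x - 3) + 5

def pr_xy(x, y):
    # Closed form derived above: Pr(x, y) = exp{-(y - f(x))^2} / (10*sqrt(pi))
    return np.exp(-(y - f(x))**2) / (10 * np.sqrt(np.pi))

x, y = 3.0, 5.0
# N(y | f(x), 0.5) * U(x | a=0, b=10); note scipy's scale is the std dev
via_scipy = norm.pdf(y, loc=f(x), scale=np.sqrt(0.5)) * uniform.pdf(x, loc=0, scale=10)
print(pr_xy(x, y), via_scipy)  # the two agree
```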

Plot of Pr(X,Y) in GeoGebra.

Bias-Variance trade-off (practice)

%%capture
!pip install seaborn==0.11.0
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt

from ipywidgets import interact, interactive, fixed, interact_manual
import ipywidgets as widgets

import plotly.graph_objects as go
import plotly.express as px

from sklearn.linear_model import LinearRegression
from sklearn import tree

from scipy.interpolate import interp1d
from numpy.random import default_rng

The training datasets

We generate the data using the function-with-noise description.

# Create d datasets of n samples each
# X 
d = 5 # number of datasets to create
n = 100 # samples per dataset
a, b = 0,10 # x domain range (a,b)
f = np.vectorize(lambda x :  (0.45*x-2)**3 - 0.55*(0.2*x-3)**2-2.5*(0.5*x-3)+ 5) # True function that we would like to learn
var = 0.5 # Noise to add to the true function
# Seed for the random generator
rng = default_rng(12345)
def generate_datasets(d:int,n:int,f:np.vectorize,a:float,b:float,var:float):
  """
  Creates datasets.

  Parameters
  ----------

  d: number of datasets
  n: number of samples per dataset
  f: function to use to generate the data
  a: min of range x
  b: max of range x
  var: variance of the normal used to generate the noise

  Returns
  -------
  X: datasets with features
  Y: datasets with targets

  """
  X = rng.uniform(low=a,high=b,size=(d,n))
  noise = rng.normal(loc=0,scale=np.sqrt(var),size=(d,n))
  Y = f(X) + noise # True labels 
  return X,Y
X_trains, y_trains = generate_datasets(d,n,f,a,b,var)

True \(f\) (for future reference)

true_f_x = np.linspace(a,b,1000)
true_f_y = f(true_f_x)

The test dataset

d_test = 1
n_test = 20

X_tests, y_tests = generate_datasets(d_test,n_test,f=f,a=a,b=b,var=var)

Save

def to_Xy_df(X,y):
  """
  Converts a numpy dataset into a dataframe dataset.
  Assumes X and y have shape (n_samples,).
  """
  Xy = np.concatenate([X.reshape(-1,1), y.reshape(-1,1)],axis=1)
  Xy_df = pd.DataFrame(data = Xy.copy(), columns = ["X","y"])
  return Xy_df


Xy_train_df = to_Xy_df(X_trains[0],y_trains[0]) 
Xy_test_df = to_Xy_df(X_tests[0],y_tests[0])
Xy_train_df.to_csv("Xy_train.csv",index=False)
Xy_test_df.to_csv("Xy_test.csv",index=False)

Plotting the datasets

Plotting the datasets (no hover)

def plot_f(i: int,x_datasets: np.array,y_datasets_w_noise: np.array,true_f_x: np.array,true_f_y: np.array):
  """
  Plot the function and the data coming from it for specific dataset i in x_datasets.
  i: index to indicate which dataset to plot from x_datasets.
  x_datasets: input features of the d datasets.
  y_datasets_w_noise: targets of the d datasets.
  true_f_x: x coordinates of the points from the true function.
  true_f_y: y coordinates of the points from the true function.
  """
  fig,ax = plt.subplots(figsize=(7,7))
  ax.set_ylim((-4,18))
  ax.set_xlim((0,10))
  ax.plot(true_f_x,true_f_y,color='blue')
  ax.scatter(x_datasets[i,:],y_datasets_w_noise[i,:],color = 'orange')
  return ax
interact(lambda i : plot_f(i,X_trains,y_trains,true_f_x,true_f_y), i=widgets.IntSlider(min=0, max=d-1, step=1, value=0));

Plotting datasets (with hover)

def f_plotly(i: int,x_datasets: np.array,y_datasets_w_noise: np.array,true_f_x: np.array,true_f_y: np.array):
  """
  Plot the function and the data coming from it for specific dataset i in x_datasets (using plotly).
  i: index to indicate which dataset to plot from x_datasets.
  x_datasets: input features of the d datasets.
  y_datasets_w_noise: targets of the d datasets.
  true_f_x: x coordinates of the points from the true function.
  true_f_y: y coordinates of the points from the true function.
  """
  fig = go.Figure(
      layout=go.Layout(
        title=go.layout.Title(text=f"Samples and True Function. Dataset {i}"),
        xaxis_range = (0,10),
        yaxis_range = (-4,20),
        height=600,
        width=600,
        ))
  fig.add_trace(go.Scatter(x=x_datasets[i,:],y=y_datasets_w_noise[i,:],mode='markers',name='samples'))
  fig.add_trace(go.Scatter(x=true_f_x,y=true_f_y,mode='lines',name='true function'))

  return fig
interact(lambda i : f_plotly(i,X_trains,y_trains,true_f_x,true_f_y), i=widgets.IntSlider(min=0, max=d-1, step=1, value=0));

Plotting P(X,Y)

https://www.geogebra.org/3d/uyqknepp

Plotting P(X,Y) (no hover)

import seaborn as sns

s = 1000000
a = 0   # lower bound of the uniform
b = 10  # upper bound of the uniform
x = np.random.uniform(low=a, high=b, size=(1, s))
y_dn = f(x) + np.random.normal(loc=0, scale=np.sqrt(0.5), size=(1, s))
data = pd.DataFrame(x.T, columns=["x"])
data["y"] = y_dn.T
f_dist = sns.displot(data, x="x", y="y")

Plotting P(X,Y) (with hover)

s = 10000000
x = np.random.uniform(low=a, high=b, size=(1, s))
y_dn = f(x) + np.random.normal(loc=0, scale=np.sqrt(0.5), size=(1, s))
xedges = np.linspace(0, 10, 100)
yedges = np.linspace(0, 20, 100)
H, xedges, yedges = np.histogram2d(x.ravel(), y_dn.ravel(), bins=(xedges, yedges))
H /= s   # normalize
H = H.T  # let each row list bins with a common y range

# Plot on the bin centers: H has one fewer row/column than the edge arrays
xcenters = (xedges[:-1] + xedges[1:]) / 2
ycenters = (yedges[:-1] + yedges[1:]) / 2
X, Y = np.meshgrid(xcenters, ycenters)
fig = go.Figure(data=[go.Surface(x=X, y=Y, z=H)],
                layout=go.Layout(height=600, width=600))
fig.show()

Average function (from samples)

def average_f(X: np.array,Y: np.array,a: float,b: float):
  """
  Given datasets with samples computes the average function.
  X: matrix of x with shape (datasets,samples)
  Y: matrix of y with shape (datasets,samples)
  a: min x range
  b: max x range

  """
  assert X.shape == Y.shape
  
  datasets, samples = X.shape

  fs = [interp1d(X[i,:], Y[i,:],'linear',bounds_error=False) for i in range(datasets)]

  # define common carrier for calculation of average curve
  x_all = np.linspace(a,b, num=101)
  
  # evaluation of fits on common carrier
  f_ints = [f(x_all) for f in fs]

  # put all fits to one matrix for fast mean calculation
  data_collection = np.vstack(f_ints)

  # calculating mean value
  f_avg_points = np.average(data_collection, axis=0)

  f_avg = interp1d(x_all,f_avg_points,bounds_error=False)

  return f_avg_points , f_avg

Plot Bias Variance

def plot_bias_variance(x_datasets,preds,h_avg,y_datasets_w_noise,f_avg):
  x_domain = np.linspace(0, 10, num=101)

  fig, ax = plt.subplots(figsize = (16,9))
  
  ax.scatter(x_datasets.ravel(),preds.ravel(),alpha=0.1,color='orange')
  ax.plot(x_domain,h_avg(x_domain),color='orange')
  ax.scatter(x_datasets.ravel(),y_datasets_w_noise.ravel(),alpha=0.1,color='blue')
  ax.plot(x_domain,f_avg(x_domain),color='violet')

  return fig, ax
\[\begin{align*} \underbrace{ \mathbb{E}_{\mathbb{x},y,D} \Big[{(h_D(\mathbb{x})−y)}^2 \Big] }_{\text{Expected Test Error}}= \underbrace{\mathbb{E}_{\mathbb{x},D} \Big[ {(h_D(\mathbb{x})−\bar{h}(\mathbb{x}))}^2\Big]}_{\text{Variance}} + \underbrace{ \mathbb{E}_{\mathbb{x}} \Big[{(\bar{h}(\mathbb{x})−\bar{y}(\mathbb{x}))}^2\Big] }_{\text{Bias}^2} + \underbrace{ \mathbb{E}_{\mathbb{x},y}\Big[(\bar{y}(x)−y)^2\Big] }_{\text{Noise}} \end{align*}\]
def avg_square_diff(a,b,sel=None):
  """
  Average square difference between two tensors with same shape
  """
  assert a.shape == b.shape
  if sel is None:
    sel = np.ones(a.shape, dtype=bool).ravel()
  return np.nanmean((a.ravel()[sel] - b.ravel()[sel] )**2)
def compute_error_decomposition_test(d,x_test,preds_test,h_avg,f_avg,y_test_w_noise,x_min,x_max):
  sel = (np.tile(x_test,(d,1)).ravel() > x_min) & ((np.tile(x_test,(d,1)).ravel() < x_max))
  variance = avg_square_diff(preds_test,h_avg(np.tile(x_test,(d,1))),sel)
  
  sel = (x_test.ravel() > x_min) & (x_test.ravel() < x_max)
  bias2 = avg_square_diff(h_avg(x_test),f_avg(x_test),sel)
  noise = avg_square_diff(f_avg(x_test),y_test_w_noise,sel)

  sel = (np.tile(x_test,(d,1)).ravel() > x_min) & ((np.tile(x_test,(d,1)).ravel() < x_max))
  test_error = avg_square_diff(preds_test,np.tile(y_test_w_noise,(d,1)),sel)
  return {"test_error":test_error,"variance":variance,"bias2":bias2,"noise":noise}
def display_result(test_error,variance,bias2,noise):
  res = f"Test Error: {test_error} = Variance: {variance} + Bias^2: {bias2} + Noise: {noise} \n {test_error} = {variance + bias2 + noise}"
  print(res)

Train different models and compute metrics

def complexity_analysis(x_datasets,y_datasets_w_noise,x_test,y_test_w_noise,a,b):
  
  assert x_datasets.shape == y_datasets_w_noise.shape
  # Infer d,n from x_dataset
  d,n = x_datasets.shape

  results = {}
  for max_depth in range(1,10):
    models = []
    preds = np.zeros([d,n])
    for i in range(d):
      reg = tree.DecisionTreeRegressor(max_depth=max_depth).fit(x_datasets[i,:].reshape(-1,1), y_datasets_w_noise[i,:].reshape(-1,1))
      pred = reg.predict(x_datasets[i,:].reshape(-1,1)).ravel()
      models.append(reg)
      preds[i,:] = pred

    # Use the d models that we train with the training data and make them predict
    # for the test set
    preds_test = np.zeros((d, x_test.size))  # infer the test-set size from x_test instead of the global n_test
    for i in range(d):
      preds_test[i,:] = models[i].predict(x_test.reshape(-1,1)).ravel()

    # Use the true f as ybar(x): with synthetic data we know it exactly,
    # so there is no need to estimate it from the samples.
    f_avg = np.vectorize(lambda x :  (0.45*x-2)**3 - 0.55*(0.2*x-3)**2-2.5*(0.5*x-3)+ 5)
    h_avg_points , h_avg = average_f(np.tile(x_test,(d,1)),preds_test,a,b)

    fig,ax = plot_bias_variance(x_datasets,preds,h_avg,y_datasets_w_noise,f_avg)

    x_min = a
    x_max = b
    results[max_depth] = {"plot":(fig,ax),"values":compute_error_decomposition_test(d,x_test,preds_test,h_avg,f_avg,y_test_w_noise,x_min,x_max)}
  return results
def complexity_analysis_results_to_numpy(results):
  """
  Transforms results into a tuple of arrays for plotting.
  results: output of complexity_analysis function
  """
  inds = np.array([i for i,d in results.items()])
  test_errors = np.array([d["values"]["test_error"] for i,d in results.items()])
  vars = np.array([d["values"]["variance"] for i,d in results.items()])
  biases = np.array([d["values"]["bias2"] for i,d in results.items()])
  noises = np.array([d["values"]["noise"] for i,d in results.items()])
  test_errors_min_ind = np.argmin(test_errors)

  return (inds,test_errors,vars,biases,noises,test_errors_min_ind)
def plot_bias_var_to(inds,test_errors,vars,biases,noises,test_errors_min_ind):
  fig, ax = plt.subplots(figsize = (16,9))

  ax.plot(inds,test_errors,label='test error')
  ax.plot(inds[test_errors_min_ind], test_errors[test_errors_min_ind], 'ro', label= 'Minimum Test Error')
  ax.plot(inds,vars,'--',label='variance')
  ax.plot(inds,biases,'--',label='bias')
  ax.plot(inds,noises,'--',label='noise')
  plt.xlabel("Complexity (tree: max_depth)")
  plt.title("Bias-Variance Tradeoff")
  plt.legend()

Experiment 1

d = 5 # number of training datasets 
n = 100 # samples per training dataset
a, b = 0,10 # x domain range (a,b)
f = np.vectorize(lambda x :  (0.45*x-2)**3 - 0.55*(0.2*x-3)**2-2.5*(0.5*x-3)+ 5) # True function that we would like to learn
var = 0.5 # Noise to add to the true function

d_test = 1 # number of test datasets 
n_test = 20 # samples per test dataset
x_trains, y_trains = generate_datasets(d,n,f,a,b,var)
x_tests, y_tests = generate_datasets(d_test,n_test,f=f,a=a,b=b,var=var)
results = complexity_analysis(x_trains,y_trains,x_tests,y_tests,a,b)

Experiment 1: Results

We see that as the model becomes more complex:

  • The bias decreases.

  • The variance increases.

  • The noise stays the same.

plot_bias_var_to(*complexity_analysis_results_to_numpy(results))

Experiment 2

Let's see what happens when we use more data.

d = 5 # number of datasets to create
n = 500 # samples per dataset
a, b = 0,10 # x domain range (a,b)
f = np.vectorize(lambda x :  (0.45*x-2)**3 - 0.55*(0.2*x-3)**2-2.5*(0.5*x-3)+ 5) # True function that we would like to learn
var = 0.5 # Noise to add to the true function

d_test = 1
n_test = 200
x_trains, y_trains = generate_datasets(d,n,f,a,b,var)
x_tests, y_tests = generate_datasets(d_test,n_test,f=f,a=a,b=b,var=var)
results =  complexity_analysis(x_trains,y_trains,x_tests,y_tests,a,b)

Experiment 2: Results

We see that as we increase the amount of data, the performance of our model improves (but the limit of that improvement is the noise).

plot_bias_var_to(*complexity_analysis_results_to_numpy(results))

Appendix

Expectation of a function

The expectation of a function \(f(x)\) can be approximated by

\[ \mathbb{E}_{x}[f] \simeq \frac{1}{N} \sum_{n=1}^{N} f(x_n)\]

with this approximation becoming exact in the limit \(N \to \infty\) ([Bis07], p. 20).
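For instance, with the made-up choice \(f(x) = x^2\) and \(x \sim \mathcal{N}(0,1)\), whose exact expectation is \(1\):

```python
import numpy as np

rng = np.random.default_rng(0)

# E[x^2] for x ~ N(0, 1) equals 1; the sample mean approximates it
x_n = rng.normal(0, 1, size=200_000)
estimate = np.mean(x_n ** 2)
print(estimate)  # close to 1.0
```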

Notation

Many of my difficulties when studying ML come from notation. Learning to “read mathematics”, so that you can later “speak” and write that language, is fundamental.

Example 1:

\(X \sim \mathcal{N}(\mu,\sigma^2)\) reads: the random variable \(X\) has a normal distribution with mean \(\mu\) and standard deviation \(\sigma\) (variance \(\sigma^2\)). On the other hand, when I want to describe the probability density \(p(x)\), I can write \(p(x) = \mathcal{N}(x | \mu,\sigma^2)\). In this case, \(p(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\text{exp} \Big\{-\frac{1}{2\sigma^2}(x-\mu)^2\Big\}\).

References

Bis07

Christopher Bishop. Pattern Recognition and Machine Learning. Springer, 2007. ISBN 978-0387310732.

CB01

George Casella and Roger Berger. Statistical Inference. Duxbury Resource Center, June 2001. ISBN 0534243126.

Bias-Variance trade-off with other loss functions and for other problems (classification)

https://homes.cs.washington.edu/~pedrod/bvd.pdf

https://web.engr.oregonstate.edu/~tgd/classes/534/slides/part9.pdf

https://people.eecs.berkeley.edu/~jrs/189s19/lec/12.pdf

http://cs229.stanford.edu/summer2020/BiasVarianceAnalysis.pdf

https://towardsdatascience.com/the-bias-variance-tradeoff-8818f41e39e9

2.8928928928928928, 2.9029029029029028, 2.9129129129129128, 2.9229229229229228, 2.932932932932933, 2.942942942942943, 2.952952952952953, 2.962962962962963, 2.972972972972973, 2.982982982982983, 2.992992992992993, 3.003003003003003, 3.013013013013013, 3.023023023023023, 3.033033033033033, 3.043043043043043, 3.053053053053053, 3.063063063063063, 3.073073073073073, 3.083083083083083, 3.093093093093093, 3.103103103103103, 3.113113113113113, 3.123123123123123, 3.133133133133133, 3.143143143143143, 3.153153153153153, 3.163163163163163, 3.173173173173173, 3.183183183183183, 3.193193193193193, 3.203203203203203, 3.2132132132132134, 3.2232232232232234, 3.2332332332332334, 3.2432432432432434, 3.2532532532532534, 3.2632632632632634, 3.2732732732732734, 3.2832832832832834, 3.2932932932932935, 3.3033033033033035, 3.3133133133133135, 3.3233233233233235, 3.3333333333333335, 3.3433433433433435, 3.3533533533533535, 3.3633633633633635, 3.3733733733733735, 3.3833833833833835, 3.3933933933933935, 3.4034034034034035, 3.4134134134134135, 3.4234234234234235, 3.4334334334334335, 3.4434434434434436, 3.4534534534534536, 3.4634634634634636, 3.4734734734734736, 3.4834834834834836, 3.4934934934934936, 3.5035035035035036, 3.5135135135135136, 3.5235235235235236, 3.5335335335335336, 3.5435435435435436, 3.5535535535535536, 3.5635635635635636, 3.5735735735735736, 3.5835835835835836, 3.5935935935935936, 3.6036036036036037, 3.6136136136136137, 3.6236236236236237, 3.6336336336336337, 3.6436436436436437, 3.6536536536536537, 3.6636636636636637, 3.6736736736736737, 3.6836836836836837, 3.6936936936936937, 3.7037037037037037, 3.7137137137137137, 3.7237237237237237, 3.7337337337337337, 3.7437437437437437, 3.7537537537537538, 3.7637637637637638, 3.7737737737737738, 3.7837837837837838, 3.793793793793794, 3.803803803803804, 3.813813813813814, 3.823823823823824, 3.833833833833834, 3.843843843843844, 3.853853853853854, 3.863863863863864, 3.873873873873874, 3.883883883883884, 3.893893893893894, 3.903903903903904, 
3.913913913913914, 3.923923923923924, 3.933933933933934, 3.943943943943944, 3.953953953953954, 3.963963963963964, 3.973973973973974, 3.983983983983984, 3.993993993993994, 4.004004004004004, 4.014014014014014, 4.024024024024024, 4.034034034034034, 4.044044044044044, 4.054054054054054, 4.064064064064064, 4.074074074074074, 4.084084084084084, 4.094094094094094, 4.104104104104104, 4.114114114114114, 4.124124124124124, 4.134134134134134, 4.1441441441441444, 4.1541541541541545, 4.1641641641641645, 4.1741741741741745, 4.1841841841841845, 4.1941941941941945, 4.2042042042042045, 4.2142142142142145, 4.2242242242242245, 4.2342342342342345, 4.2442442442442445, 4.2542542542542545, 4.2642642642642645, 4.2742742742742745, 4.2842842842842845, 4.2942942942942945, 4.3043043043043046, 4.314314314314315, 4.324324324324325, 4.334334334334335, 4.344344344344345, 4.354354354354355, 4.364364364364365, 4.374374374374375, 4.384384384384385, 4.394394394394395, 4.404404404404405, 4.414414414414415, 4.424424424424425, 4.434434434434435, 4.444444444444445, 4.454454454454455, 4.464464464464465, 4.474474474474475, 4.484484484484485, 4.494494494494495, 4.504504504504505, 4.514514514514515, 4.524524524524525, 4.534534534534535, 4.544544544544545, 4.554554554554555, 4.564564564564565, 4.574574574574575, 4.584584584584585, 4.594594594594595, 4.604604604604605, 4.614614614614615, 4.624624624624625, 4.634634634634635, 4.644644644644645, 4.654654654654655, 4.664664664664665, 4.674674674674675, 4.684684684684685, 4.694694694694695, 4.704704704704705, 4.714714714714715, 4.724724724724725, 4.734734734734735, 4.744744744744745, 4.754754754754755, 4.764764764764765, 4.774774774774775, 4.784784784784785, 4.794794794794795, 4.804804804804805, 4.814814814814815, 4.824824824824825, 4.834834834834835, 4.844844844844845, 4.854854854854855, 4.864864864864865, 4.874874874874875, 4.884884884884885, 4.894894894894895, 4.904904904904905, 4.914914914914915, 4.924924924924925, 4.934934934934935, 4.944944944944945, 
4.954954954954955, 4.964964964964965, 4.974974974974975, 4.984984984984985, 4.994994994994995, 5.005005005005005, 5.015015015015015, 5.025025025025025, 5.035035035035035, 5.045045045045045, 5.055055055055055, 5.065065065065065, 5.075075075075075, 5.085085085085085, 5.095095095095095, 5.105105105105105, 5.115115115115115, 5.125125125125125, 5.135135135135135, 5.145145145145145, 5.155155155155155, 5.165165165165165, 5.175175175175175, 5.185185185185185, 5.195195195195195, 5.205205205205205, 5.215215215215215, 5.225225225225225, 5.235235235235235, 5.245245245245245, 5.255255255255255, 5.265265265265265, 5.275275275275275, 5.285285285285285, 5.295295295295295, 5.305305305305305, 5.315315315315315, 5.325325325325325, 5.335335335335335, 5.345345345345345, 5.355355355355355, 5.365365365365365, 5.375375375375375, 5.385385385385385, 5.395395395395395, 5.405405405405405, 5.415415415415415, 5.425425425425425, 5.435435435435435, 5.445445445445445, 5.455455455455455, 5.465465465465465, 5.475475475475475, 5.485485485485485, 5.495495495495495, 5.505505505505505, 5.515515515515515, 5.525525525525525, 5.535535535535535, 5.545545545545545, 5.555555555555555, 5.565565565565565, 5.575575575575575, 5.585585585585585, 5.595595595595595, 5.605605605605605, 5.615615615615615, 5.625625625625625, 5.635635635635635, 5.645645645645645, 5.655655655655655, 5.665665665665665, 5.675675675675675, 5.685685685685685, 5.6956956956956954, 5.7057057057057055, 5.7157157157157155, 5.7257257257257255, 5.7357357357357355, 5.7457457457457455, 5.7557557557557555, 5.7657657657657655, 5.7757757757757755, 5.7857857857857855, 5.7957957957957955, 5.8058058058058055, 5.8158158158158155, 5.8258258258258255, 5.8358358358358355, 5.8458458458458455, 5.8558558558558556, 5.865865865865866, 5.875875875875876, 5.885885885885886, 5.895895895895896, 5.905905905905906, 5.915915915915916, 5.925925925925926, 5.935935935935936, 5.945945945945946, 5.955955955955956, 5.965965965965966, 5.975975975975976, 5.985985985985986, 
5.995995995995996, 6.006006006006006, 6.016016016016016, 6.026026026026026, 6.036036036036036, 6.046046046046046, 6.056056056056056, 6.066066066066066, 6.076076076076076, 6.086086086086086, 6.096096096096096, 6.106106106106106, 6.116116116116116, 6.126126126126126, 6.136136136136136, 6.146146146146146, 6.156156156156156, 6.166166166166166, 6.176176176176176, 6.186186186186186, 6.196196196196196, 6.206206206206206, 6.216216216216216, 6.226226226226226, 6.236236236236236, 6.246246246246246, 6.256256256256256, 6.266266266266266, 6.276276276276276, 6.286286286286286, 6.296296296296296, 6.306306306306306, 6.316316316316316, 6.326326326326326, 6.336336336336336, 6.346346346346346, 6.356356356356356, 6.366366366366366, 6.376376376376376, 6.386386386386386, 6.396396396396396, 6.406406406406406, 6.416416416416417, 6.426426426426427, 6.436436436436437, 6.446446446446447, 6.456456456456457, 6.466466466466467, 6.476476476476477, 6.486486486486487, 6.496496496496497, 6.506506506506507, 6.516516516516517, 6.526526526526527, 6.536536536536537, 6.546546546546547, 6.556556556556557, 6.566566566566567, 6.576576576576577, 6.586586586586587, 6.596596596596597, 6.606606606606607, 6.616616616616617, 6.626626626626627, 6.636636636636637, 6.646646646646647, 6.656656656656657, 6.666666666666667, 6.676676676676677, 6.686686686686687, 6.696696696696697, 6.706706706706707, 6.716716716716717, 6.726726726726727, 6.736736736736737, 6.746746746746747, 6.756756756756757, 6.766766766766767, 6.776776776776777, 6.786786786786787, 6.796796796796797, 6.806806806806807, 6.816816816816817, 6.826826826826827, 6.836836836836837, 6.846846846846847, 6.856856856856857, 6.866866866866867, 6.876876876876877, 6.886886886886887, 6.896896896896897, 6.906906906906907, 6.916916916916917, 6.926926926926927, 6.936936936936937, 6.946946946946947, 6.956956956956957, 6.966966966966967, 6.976976976976977, 6.986986986986987, 6.996996996996997, 7.007007007007007, 7.017017017017017, 7.027027027027027, 7.037037037037037, 
7.047047047047047, 7.057057057057057, 7.067067067067067, 7.077077077077077, 7.087087087087087, 7.097097097097097, 7.107107107107107, 7.117117117117117, 7.127127127127127, 7.137137137137137, 7.147147147147147, 7.157157157157157, 7.167167167167167, 7.177177177177177, 7.187187187187187, 7.197197197197197, 7.207207207207207, 7.217217217217217, 7.227227227227227, 7.237237237237237, 7.247247247247247, 7.257257257257257, 7.267267267267267, 7.277277277277277, 7.287287287287287, 7.297297297297297, 7.307307307307307, 7.317317317317317, 7.327327327327327, 7.337337337337337, 7.347347347347347, 7.357357357357357, 7.367367367367367, 7.377377377377377, 7.387387387387387, 7.397397397397397, 7.407407407407407, 7.4174174174174174, 7.4274274274274275, 7.4374374374374375, 7.4474474474474475, 7.4574574574574575, 7.4674674674674675, 7.4774774774774775, 7.4874874874874875, 7.4974974974974975, 7.5075075075075075, 7.5175175175175175, 7.5275275275275275, 7.5375375375375375, 7.5475475475475475, 7.5575575575575575, 7.5675675675675675, 7.5775775775775776, 7.587587587587588, 7.597597597597598, 7.607607607607608, 7.617617617617618, 7.627627627627628, 7.637637637637638, 7.647647647647648, 7.657657657657658, 7.667667667667668, 7.677677677677678, 7.687687687687688, 7.697697697697698, 7.707707707707708, 7.717717717717718, 7.727727727727728, 7.737737737737738, 7.747747747747748, 7.757757757757758, 7.767767767767768, 7.777777777777778, 7.787787787787788, 7.797797797797798, 7.807807807807808, 7.817817817817818, 7.827827827827828, 7.837837837837838, 7.847847847847848, 7.857857857857858, 7.867867867867868, 7.877877877877878, 7.887887887887888, 7.897897897897898, 7.907907907907908, 7.917917917917918, 7.927927927927928, 7.937937937937938, 7.947947947947948, 7.957957957957958, 7.967967967967968, 7.977977977977978, 7.987987987987988, 7.997997997997998, 8.008008008008009, 8.018018018018019, 8.028028028028029, 8.038038038038039, 8.048048048048049, 8.058058058058059, 8.068068068068069, 8.078078078078079, 
8.088088088088089, 8.098098098098099, 8.108108108108109, 8.118118118118119, 8.128128128128129, 8.138138138138139, 8.148148148148149, 8.158158158158159, 8.168168168168169, 8.178178178178179, 8.188188188188189, 8.198198198198199, 8.208208208208209, 8.218218218218219, 8.228228228228229, 8.238238238238239, 8.248248248248249, 8.258258258258259, 8.268268268268269, 8.278278278278279, 8.288288288288289, 8.298298298298299, 8.308308308308309, 8.318318318318319, 8.328328328328329, 8.338338338338339, 8.348348348348349, 8.358358358358359, 8.368368368368369, 8.378378378378379, 8.388388388388389, 8.398398398398399, 8.408408408408409, 8.418418418418419, 8.428428428428429, 8.438438438438439, 8.448448448448449, 8.458458458458459, 8.468468468468469, 8.478478478478479, 8.488488488488489, 8.498498498498499, 8.508508508508509, 8.518518518518519, 8.528528528528529, 8.538538538538539, 8.548548548548549, 8.558558558558559, 8.568568568568569, 8.578578578578579, 8.588588588588589, 8.598598598598599, 8.608608608608609, 8.618618618618619, 8.62862862862863, 8.63863863863864, 8.64864864864865, 8.65865865865866, 8.66866866866867, 8.67867867867868, 8.68868868868869, 8.6986986986987, 8.70870870870871, 8.71871871871872, 8.72872872872873, 8.73873873873874, 8.74874874874875, 8.75875875875876, 8.76876876876877, 8.77877877877878, 8.78878878878879, 8.7987987987988, 8.80880880880881, 8.81881881881882, 8.82882882882883, 8.83883883883884, 8.84884884884885, 8.85885885885886, 8.86886886886887, 8.87887887887888, 8.88888888888889, 8.8988988988989, 8.90890890890891, 8.91891891891892, 8.92892892892893, 8.93893893893894, 8.94894894894895, 8.95895895895896, 8.96896896896897, 8.97897897897898, 8.98898898898899, 8.998998998999, 9.00900900900901, 9.01901901901902, 9.02902902902903, 9.03903903903904, 9.04904904904905, 9.05905905905906, 9.06906906906907, 9.07907907907908, 9.08908908908909, 9.0990990990991, 9.10910910910911, 9.11911911911912, 9.12912912912913, 9.13913913913914, 9.14914914914915, 9.15915915915916, 
9.16916916916917, 9.17917917917918, 9.18918918918919, 9.1991991991992, 9.20920920920921, 9.21921921921922, 9.22922922922923, 9.23923923923924, 9.24924924924925, 9.25925925925926, 9.26926926926927, 9.27927927927928, 9.28928928928929, 9.2992992992993, 9.30930930930931, 9.31931931931932, 9.32932932932933, 9.33933933933934, 9.34934934934935, 9.35935935935936, 9.36936936936937, 9.37937937937938, 9.38938938938939, 9.3993993993994, 9.40940940940941, 9.41941941941942, 9.42942942942943, 9.43943943943944, 9.44944944944945, 9.45945945945946, 9.46946946946947, 9.47947947947948, 9.48948948948949, 9.4994994994995, 9.50950950950951, 9.51951951951952, 9.52952952952953, 9.53953953953954, 9.54954954954955, 9.55955955955956, 9.56956956956957, 9.57957957957958, 9.58958958958959, 9.5995995995996, 9.60960960960961, 9.61961961961962, 9.62962962962963, 9.63963963963964, 9.64964964964965, 9.65965965965966, 9.66966966966967, 9.67967967967968, 9.68968968968969, 9.6996996996997, 9.70970970970971, 9.71971971971972, 9.72972972972973, 9.73973973973974, 9.74974974974975, 9.75975975975976, 9.76976976976977, 9.77977977977978, 9.78978978978979, 9.7997997997998, 9.80980980980981, 9.81981981981982, 9.82982982982983, 9.83983983983984, 9.84984984984985, 9.85985985985986, 9.86986986986987, 9.87987987987988, 9.88988988988989, 9.8998998998999, 9.90990990990991, 9.91991991991992, 9.92992992992993, 9.93993993993994, 9.94994994994995, 9.95995995995996, 9.96996996996997, 9.97997997997998, 9.98998998998999, 10.0], "y": [-0.4499999999999993, -0.40197570822452544, -0.3541987635987045, -0.30666861772900145, -0.2593847222218848, -0.21234652868381065, -0.16555348872125109, -0.11900505394066752, -0.07270067594851781, -0.026639806351274053, 0.019178103244600564, 0.06475360123264817, 0.1100872360063967, 0.15517955595939004, 0.2000311094851579, 0.24464244497723886, 0.28901411082916795, 0.33314665543448196, 0.3770406271867195, 0.420696574479412, 0.4641150457060945, 0.5072965892603092, 0.5502417535355892, 
0.5929510869254679, 0.6354251378234821, 0.6776644546231685, 0.7196695857180657, 0.7614410795017088, 0.8029794843676292, 0.8442853487093656, 0.8853592209204582, 0.9262016493944385, 0.9668131825248398, 1.0071943687052043, 1.0473457563290633, 1.0872678937899556, 1.1269613294814178, 1.1664266117969815, 1.2056642891301887, 1.2446749098745693, 1.28345902242366, 1.3220171751710064, 1.3603499165101294, 1.3984577948345782, 1.4363413585378773, 1.474001156013574, 1.511437735655198, 1.5486516458562827, 1.5856434350103683, 1.6224136515109917, 1.6589628437516861, 1.6952915601259901, 1.7314003490274352, 1.7672897588495617, 1.802960337985903, 1.8384126348299974, 1.873647197775382, 1.9086645752155844, 1.9434653155441506, 1.978049967154612, 2.012419078440505, 2.046573197795367, 2.0805128736127294, 2.114238654286134, 2.1477510882091124, 2.1810507237752033, 2.214138109377945, 2.247013793410866, 2.2796783242675076, 2.3121322503414037, 2.3443761200260926, 2.3764104817151095, 2.4082358838019893, 2.439852874680269, 2.471262002743483, 2.502463816385169, 2.533458863998863, 2.564247693978099, 2.594830854716415, 2.6252088946073444, 2.6553823620444277, 2.6853518054211998, 2.715117773131192, 2.7446808135679435, 2.774041475124992, 2.80320030619587, 2.832157855174115, 2.860914670453263, 2.8894713004268517, 2.917828293488415, 2.945986198031486, 2.9739455624496083, 3.00170693513631, 3.0292708644851336, 3.056637898889612, 3.0838085867432783, 3.110783476439675, 3.1375631163723314, 3.1641480549347865, 3.1905388405205803, 3.216736021523241, 3.2427401463363106, 3.2685517633533214, 3.294171420967812, 3.3195996675733177, 3.344837051563373, 3.3698841213315145, 3.3947414252712784, 3.419409511776202, 3.4438889292398223, 3.4681802260556696, 3.492283950617284, 3.5162006513182025, 3.5399308765519573, 3.5634751747120887, 3.5868340941921275, 3.610008183385615, 3.6329979906860848, 3.6558040644870715, 3.6784269531821154, 3.700867205164746, 3.723125368828505, 3.745201992566926, 3.7670976247735437, 3.788812813841899, 
3.8103481081655195, 3.8317040561379496, 3.8528812061527216, 3.8738801066033695, 3.8947013058834346, 3.9153453523864474, 3.9358127945059476, 3.956104180635469, 3.976220059168548, 3.9961609784987235, 4.015927487019525, 4.0355201331244945, 4.054939465207165, 4.074186031661074, 4.093260380879759, 4.11216306125675, 4.130894621185589, 4.1494556090598085, 4.1678465732729455, 4.186068062218538, 4.204120624290119, 4.222004807881225, 4.239721161385394, 4.2572702331961585, 4.27465257170706, 4.291868725311627, 4.308919242403403, 4.32580467137592, 4.342525560622712, 4.359082458537319, 4.375475913513275, 4.391706473944117, 4.407774688223381, 4.4236811047446, 4.439426271901316, 4.455010738087058, 4.4704350516953655, 4.485699761119776, 4.500805414753822, 4.515752560991042, 4.530541748224972, 4.545173524849146, 4.559648439257103, 4.573967039842374, 4.5881298749985, 4.602137493119015, 4.615990442597454, 4.629689271827357, 4.643234529202253, 4.656626763115685, 4.669866521961184, 4.6829543541322876, 4.695890808022535, 4.708676432025457, 4.721311774534592, 4.733797383943476, 4.746133808645644, 4.7583215970346355, 4.770361297503981, 4.78225345844722, 4.793998628257887, 4.805597355329519, 4.817050188055653, 4.828357674829822, 4.839520364045565, 4.8505388040964155, 4.86141354337591, 4.8721451302775876, 4.882734113194979, 4.8931810405216245, 4.903486460651058, 4.9136509219768145, 4.923674972892433, 4.933559161791448, 4.943304037067395, 4.95291014711381, 4.96237804032423, 4.97170826509219, 4.980901369811225, 4.989957902874874, 4.99887841267667, 5.007663447610151, 5.0163135560688525, 5.02482928644631, 5.033211187136057, 5.041459806531635, 5.0495756930265765, 5.057559395014418, 5.065411460888695, 5.073132439042944, 5.080722877870701, 5.088183325765503, 5.095514331120884, 5.102716442330381, 5.10979020778753, 5.116736175885866, 5.123554895018927, 5.1302469135802475, 5.136812779963363, 5.1432530425618115, 5.149568249769126, 5.155758949978845, 5.161825691584505, 5.167769022979639, 
5.173589492557785, 5.179287648712479, 5.184864039837256, 5.190319214325654, 5.1956537205712054, 5.20086810696745, 5.20596292190792, 5.210938713786156, 5.215796030995691, 5.22053542193006, 5.225157434982801, 5.229662618547449, 5.234051521017541, 5.238324690786612, 5.242482676248199, 5.246526025795837, 5.25045528782306, 5.254271010723409, 5.257973742890417, 5.261564032717619, 5.265042428598553, 5.268409478926753, 5.271665732095757, 5.2748117364991, 5.277848040530317, 5.280775192582946, 5.283593741050522, 5.286304234326581, 5.288907220804659, 5.291403248878291, 5.293792866941016, 5.296076623386365, 5.298255066607879, 5.300328744999091, 5.302298206953537, 5.304164000864756, 5.30592667512628, 5.307586778131647, 5.309144858274394, 5.310601463948053, 5.311957143546165, 5.313212445462263, 5.314367918089884, 5.315424109822563, 5.316381569053837, 5.31724084417724, 5.3180024835863104, 5.318667035674583, 5.319235048835596, 5.319707071462881, 5.320083651949978, 5.32036533869042, 5.320552680077746, 5.32064622450549, 5.320646520367187, 5.320554116056375, 5.320369559966588, 5.320093400491366, 5.3197261860242415, 5.31926846495875, 5.318720785688429, 5.318083696606815, 5.317357746107442, 5.316543482583848, 5.315641454429566, 5.314652210038137, 5.313576297803092, 5.312414266117971, 5.311166663376306, 5.309834037971635, 5.308416938297494, 5.306915912747419, 5.3053315097149465, 5.3036642775936125, 5.301914764776951, 5.3000835196585, 5.2981710906317945, 5.296178026090371, 5.294104874427765, 5.2919521840375126, 5.28972050331315, 5.287410380648213, 5.285022364436238, 5.28255700307076, 5.280014844945315, 5.277396438453442, 5.2747023319886726, 5.271933073944545, 5.269089212714595, 5.266171296692359, 5.263179874271372, 5.260115493845169, 5.256978703807288, 5.253770052551266, 5.250490088470636, 5.247139359958936, 5.243718415409702, 5.240227803216467, 5.236668071772771, 5.233039769472148, 5.2293434447081335, 5.225579645874265, 5.221748921364076, 5.2178518195711066, 5.213888888888889, 
5.20986067771096, 5.205767734430857, 5.201610607442114, 5.197389845138268, 5.193105995912857, 5.188759608159413, 5.184351230271474, 5.179881410642576, 5.175350697666256, 5.170759639736049, 5.166108785245489, 5.161398682588115, 5.156629880157462, 5.151802926347065, 5.146918369550461, 5.141976758161185, 5.136978640572775, 5.131924565178765, 5.126815080372692, 5.1216507345480915, 5.116432076098499, 5.1111596534174515, 5.105834014898484, 5.100455708935135, 5.095025283920936, 5.089543288249427, 5.084010270314142, 5.078426778508616, 5.072793361226388, 5.067110566860993, 5.061378943805965, 5.055599040454842, 5.049771405201159, 5.043896586438453, 5.037975132560259, 5.032007591960111, 5.02599451303155, 5.019936444168108, 5.013833933763323, 5.007687530210729, 5.001497781903863, 4.9952652372362625, 4.988990444601461, 4.982673952392995, 4.976316309004403, 4.969918062829217, 4.963479762260976, 4.957001955693214, 4.950485191519469, 4.943930018133276, 4.937336983928169, 4.930706637297687, 4.924039526635365, 4.91733620033474, 4.910597206789345, 4.903823094392718, 4.897014411538395, 4.8901717066199115, 4.8832955280308035, 4.8763864241646075, 4.869444943414859, 4.862471634175094, 4.855467044838848, 4.848431723799659, 4.841366219451061, 4.83427108018659, 4.827146854399783, 4.819994090484174, 4.812813336833303, 4.805605141840702, 4.798370053899907, 4.791108621404458, 4.783821392747885, 4.77650891632373, 4.769171740525527, 4.761810413746809, 4.754425484381116, 4.747017500821981, 4.739587011462941, 4.7321345646975335, 4.724660708919291, 4.717165992521753, 4.709650963898454, 4.70211617144293, 4.694562163548718, 4.686989488609351, 4.679398695018369, 4.671790331169304, 4.6641649454556955, 4.656523086271077, 4.648865302008986, 4.641192141062957, 4.633504151826528, 4.625801882693233, 4.61808588205661, 4.610356698310191, 4.602614879847518, 4.594860975062121, 4.587095532347541, 4.579319100097311, 4.571532226704966, 4.563735460564045, 4.555929350068082, 4.548114443610615, 4.540291289585178, 
4.532460436385307, 4.524622432404539, 4.516777826036408, 4.508927165674454, 4.5010709997122085, 4.49320987654321, 4.4853443445609935, 4.4774749521590955, 4.469602247731052, 4.461726779670398, 4.453849096370671, 4.4459697462254075, 4.438089277628141, 4.4302082389724085, 4.422327178651747, 4.414446645059691, 4.406567186589776, 4.39868935163554, 4.390813688590519, 4.382940745848248, 4.375071071802262, 4.3672052148460985, 4.359343723373293, 4.351487145777382, 4.343636030451901, 4.335790925790384, 4.327952380186371, 4.320120942033395, 4.312297159724992, 4.3044815816547, 4.296674756216054, 4.288877231802589, 4.281089556807842, 4.273312279625349, 4.265545948648645, 4.257791112271267, 4.250048318886752, 4.242318116888633, 4.234601054670448, 4.226897680625733, 4.219208543148023, 4.211534190630855, 4.203875171467763, 4.1962320340522865, 4.188605326777959, 4.180995598038316, 4.173403396226895, 4.16582926973723, 4.158273766962861, 4.15073743629732, 4.143220826134144, 4.13572448486687, 4.128248960889033, 4.120794802594169, 4.113362558375815, 4.105952776627505, 4.098566005742777, 4.091202794115166, 4.083863690138207, 4.076549242205439, 4.069259998710395, 4.0619965080466125, 4.054759318607626, 4.047548978786973, 4.04036603697819, 4.033211041574811, 4.0260845409703725, 4.018987083558412, 4.011919217732463, 4.004881491886064, 3.9978744544127496, 3.9908986537060556, 3.983954638159519, 3.9770429561666742, 3.9701641561210588, 3.9633187864162083, 3.956507395445658, 3.9497305316029445, 3.9429887432816035, 3.9362825788751716, 3.9296125867771843, 3.9229793153811765, 3.916383313080686, 3.909825128269248, 3.903305309340398, 3.896824404687673, 3.8903829627046087, 3.8839815317847406, 3.8776206603216044, 3.8713008967087372, 3.865022789339674, 3.8587868866079513, 3.852593736907105, 3.8464438886306698, 3.840337890172184, 3.834276289925182, 3.8282596362832004, 3.822288477639775, 3.8163633623884414, 3.8104848389227364, 3.8046534556361955, 3.7988697609223543, 3.793134303174749, 3.7874476307869163, 
3.7818102921523913, 3.77622283566471, 3.770685809717409, 3.765199762704024, 3.759765243018091, 3.754382799053145, 3.7490529792027236, 3.743776331860362, 3.738553405419596, 3.733384748273962, 3.728270908816995, 3.723212435442233, 3.7182098765432103, 3.713263780513463, 3.708374695746527, 3.703543170635939, 3.6987697535752346, 3.69405499295795, 3.689399437177621, 3.684803634627783, 3.680268133701973, 3.6757934827937264, 3.6713802302965792, 3.6670289246040677, 3.662740114109727, 3.658514347207094, 3.6543521722897045, 3.6502541377510944, 3.6462207919847995, 3.6422526833843554, 3.638350360343299, 3.6345143712551664, 3.630745264513492, 3.627043588511814, 3.623409891643666, 3.6198447223025854, 3.6163486288821085, 3.6129221597757706, 3.6095658633771075, 3.606280288079655, 3.60306598227695, 3.5999234943625282, 3.5968533727299254, 3.593856165772677, 3.59093242188432, 3.5880826894583895, 3.585307516888422, 3.582607452567954, 3.57998304489052, 3.577434842249657, 3.5749633930389013, 3.572569245651788, 3.570252948481853, 3.5680150499226335, 3.565856098367664, 3.5637766422104815, 3.5617772298446218, 3.5598584096636205, 3.5580207300610143, 3.556264739430337, 3.5545909861651284, 3.553000018658921, 3.5514923853052522, 3.5500686344976584, 3.548729314629674, 3.547474974094838, 3.546306161286682, 3.545223424598746, 3.5442273124245642, 3.5433183731576716, 3.542497155191607, 3.541764206919903, 3.5411200767360986, 3.540565313033728, 3.5401004642063265, 3.539726078647433, 3.53944270475058, 3.5392508909093063, 3.5391511855171465, 3.5391441369676357, 3.5392302936543127, 3.5394102039707103, 3.539684416310367, 3.5400534790668168, 3.5405179406335967, 3.5410783494042435, 3.5417352537722904, 3.542489202131277, 3.5433407428747365, 3.5442904243962055, 3.5453387950892217, 3.546486403347318, 3.5477337975640335, 3.5490815261329023, 3.55053013744746, 3.5520801799012447, 3.55373220188779, 3.5554867518006334, 3.55734437803331, 3.5593056289793568, 3.5613710530323095, 3.563541198585703, 3.5658166140330745, 
*[Figure: Samples and True Function. Dataset 0 — training samples drawn from \(P(X,Y)\) plotted against the true underlying function.]*